Quasi-Gradient Nonlinear Simplex Optimization Method in Electromagnetics


Abstract

Particle swarm optimization (PSO), the genetic algorithm (GA), and the nonlinear simplex optimization method (SOM) are among the most prominent gradient-free algorithms in engineering. For a common group of electromagnetic problems in which fewer than 10 parameters span the problem domain, SOM features a faster convergence rate than PSO and GA. Nevertheless, GA still outperforms it in the accuracy of finding the global minimum. To improve SOM for problems with few parameters, a quasi-gradient (Q-G) search direction is added to the conventional algorithm. An extra decision is made in the proposed method to move along the Q-G direction, alongside the reflection or during the error-reduction operations. This modification brings SOM, which otherwise fails on the examples presented in this article, to accuracy levels similar to GA, while retaining a speed advantage of approximately 33% for a relatively small number of parameters and 20% for a larger number of parameters. Following verification on standard benchmark tests, the method successfully solves a suite of electromagnetic problems. Representative examples include optimizing the absorber dimensions of an anechoic chamber and estimating the properties of an unknown embedded object from scattered microwave signals.
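The modification described in the abstract can be illustrated with a minimal sketch. The details below are assumptions of mine, not the paper's published update rules: the quasi-gradient is obtained by a least-squares fit over the simplex vertices, its step length is tied to the simplex size, and the better of the reflection and Q-G candidates replaces the worst vertex, with a shrink fallback when neither helps.

```python
import numpy as np

def quasi_gradient(simplex, fvals):
    """Least-squares fit of a gradient to the simplex vertices
    (my assumption; the paper's exact Q-G construction may differ)."""
    V = simplex[1:] - simplex[0]           # edge vectors from the best vertex
    d = fvals[1:] - fvals[0]               # corresponding error differences
    g, *_ = np.linalg.lstsq(V, d, rcond=None)
    return g

def qg_simplex_step(f, simplex, fvals, alpha=1.0):
    """One iteration: try the usual reflection AND a quasi-gradient move,
    keep whichever reduces the error more; shrink if neither helps."""
    order = np.argsort(fvals)
    simplex, fvals = simplex[order], fvals[order]
    centroid = simplex[:-1].mean(axis=0)

    x_ref = centroid + alpha * (centroid - simplex[-1])     # reflection
    g = quasi_gradient(simplex, fvals)
    scale = np.linalg.norm(simplex[-1] - centroid)          # step ~ simplex size
    x_qg = simplex[0] - scale * g / (np.linalg.norm(g) + 1e-12)

    f_new, x_new = min((f(x_ref), x_ref), (f(x_qg), x_qg), key=lambda t: t[0])
    if f_new < fvals[-1]:
        simplex[-1], fvals[-1] = x_new, f_new               # replace worst
    else:                                                   # shrink toward best
        simplex = simplex[0] + 0.5 * (simplex - simplex[0])
        fvals = np.array([f(x) for x in simplex])
    return simplex, fvals

# usage: a 2-parameter quadratic error surface, minimum at the origin
f = lambda x: float((x ** 2).sum())
simplex = np.array([[2.0, 2.0], [2.5, 2.0], [2.0, 2.5]])
fvals = np.array([f(x) for x in simplex])
for _ in range(60):
    simplex, fvals = qg_simplex_step(f, simplex, fvals)
```

The extra decision is the `min(...)` line: the reflection candidate and the Q-G candidate compete on error, which is one plausible reading of moving along Q-G "alongside reflection."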


Related resources

A Simplex Algorithm - Gradient Projection Method for Nonlinear Programming

Witzgall [7], commenting on the gradient projection methods of R. Frisch and J. B. Rosen, states: "More or less all algorithms for solving the linear programming problem are known to be modifications of an algorithm for matrix inversion. Thus the simplex method corresponds to the Gauss-Jordan method. The methods of Frisch and Rosen are based on an interesting method for inver...

A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and carry out further analysis.
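For orientation, the ingredients named above (a descent direction, a steplength rule, a global-convergence safeguard) appear in any nonlinear conjugate gradient scheme. The sketch below uses the classical Polak-Ribiere+ update with Armijo backtracking, not the paper's new formula, whose details are truncated in this snippet.

```python
import numpy as np

def cg_minimize(f, grad, x, iters=100):
    """Generic nonlinear CG: Polak-Ribiere+ beta, Armijo backtracking
    steplength, and a descent-direction safeguard (a textbook stand-in)."""
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ g < 1e-16:            # gradient vanished: converged
            break
        # Armijo backtracking line search along d
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))    # PR+ update
        d = -g_new + beta * d
        if g_new @ d >= 0:           # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# quadratic bowl with minimum at (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x = cg_minimize(f, grad, np.array([5.0, 5.0]))
```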

The Simplex Gradient and Noisy Optimization Problems

Many classes of methods for noisy optimization problems are based on function information computed on sequences of simplices. The Nelder-Mead, multidirectional search, and implicit filtering methods are three such methods. The performance of these methods can be explained in terms of the difference approximation of the gradient implicit in the function evaluations. Insight can be gained into ch...
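The gradient "implicit in the function evaluations" is the simplex gradient: with vertices x_0, ..., x_n and edge matrix V whose columns are x_i - x_0, it solves V^T g = delta(f), where delta(f) collects the differences f(x_i) - f(x_0). A small sketch (the test function and vertices are illustrative, not from the snippet):

```python
import numpy as np

def simplex_gradient(f, vertices):
    """Solve V^T g = delta(f): the gradient of the linear model of f
    interpolating the simplex vertices."""
    x0 = vertices[0]
    V = (vertices[1:] - x0).T            # columns are edge directions x_i - x_0
    delta = np.array([f(x) - f(x0) for x in vertices[1:]])
    return np.linalg.solve(V.T, delta)

# For a linear f(x, y) = 3x + 2y the simplex gradient is exact:
f = lambda x: 3 * x[0] + 2 * x[1]
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(simplex_gradient(f, verts))       # → [3. 2.]
```

On a noisy or nonlinear f the same formula yields the difference approximation that the snippet says explains the behavior of Nelder-Mead, multidirectional search, and implicit filtering.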

Derivative-free nonlinear optimization filter simplex

The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration. The first one reduces a measure of infeasibility, while in the second the objective function value is reduced. In real optimization problems, usually the objective function is not differentiable or its derivatives are unknown. In these cases it becomes essential to...
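The two-phase acceptance logic described above is usually implemented as a dominance filter over pairs (h, f), with h a measure of infeasibility and f the objective. A minimal sketch of that bookkeeping, assuming the standard non-domination rule rather than any paper-specific variant:

```python
def dominates(a, b):
    """Pair a = (h, f) dominates b if a is no worse in both infeasibility
    and objective value."""
    return a[0] <= b[0] and a[1] <= b[1]

def filter_accept(filt, point):
    """Accept 'point' = (h, f) if no filter entry dominates it; on
    acceptance, drop entries the new point dominates and add it."""
    if any(dominates(entry, point) for entry in filt):
        return filt, False
    filt = [e for e in filt if not dominates(point, e)]
    filt.append(point)
    return filt, True

filt = [(0.5, 10.0)]
filt, ok = filter_accept(filt, (0.2, 12.0))   # less infeasible: accepted
filt, ok2 = filter_accept(filt, (0.6, 11.0))  # dominated by (0.5, 10.0): rejected
```

The first phase of each iteration tries to produce points with smaller h, the second points with smaller f; the filter is what lets progress in either measure count as acceptable.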

Augmented Downhill Simplex: a Modified Heuristic Optimization Method

The Augmented Downhill Simplex Method (ADSM), a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm, is introduced here. DSM is an interpretable nonlinear local optimization method. However, as a local exploitation algorithm it can be trapped in a local minimum. In contrast, random search performs global exploration, but is less efficient. Here, rand...
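The exploration/exploitation combination described above can be mimicked with a random-restart scheme. In this sketch the local phase is a tiny 1-D descent routine standing in for DSM, and the bimodal objective is invented for illustration; none of it is from the paper.

```python
import random

def local_descent(f, x, step=0.1, iters=200):
    """Tiny shrinking-step local search (a stand-in for Downhill Simplex):
    move to a better neighbor if one exists, otherwise halve the step."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
        else:
            step *= 0.5
    return x

def augmented_search(f, lo, hi, restarts=20, seed=0):
    """Random exploration (global) feeding local descent (exploitation)."""
    rng = random.Random(seed)
    return min((local_descent(f, rng.uniform(lo, hi)) for _ in range(restarts)),
               key=f)

# f has a local minimum near x = -1 and the global minimum at x = 2
f = lambda x: (x + 1) ** 2 * (x - 2) ** 2 + 0.5 * (x - 2) ** 2
x_star = augmented_search(f, -5.0, 5.0)
```

A single descent started in the wrong basin stops near x = -1; the random restarts are what let the hybrid reach the global minimum at x = 2.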


Journal

Journal title: IEEE Access

Year: 2023

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2023.3285602